Better copy/import

From: Steven Lane
Subject: Better copy/import
Date:
Msg-id: v03007803b781d5ebe403@[65.15.153.184]
In reply to: Re: error status 139 (Tom Lane <tgl@sss.pgh.pa.us>)
Responses: Re: Better copy/import (Gary Stainburn <gary.stainburn@ringways.co.uk>)
List: pgsql-admin
Hello all:

Sorry for the bad subject line on the last version of this post.

I'm trying to load about 10M rows of data into a simple postgres table. The
data is straightforward and fairly clean, but does have glitches every few
tens of thousands of rows. My problem is that when COPY hits a bad row it
just aborts, leaving me to go back, delete or clean up the row and try
again.

Is there a way to import the records that simply skips the bad rows and
tells me which ones they were? Loading this much data is time-consuming,
especially when I have to repeat the whole load just to find each new bad
row. Is there a better way?
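One workaround (COPY itself offers no skip-and-report option here) is to pre-filter the dump before handing it to COPY: a short script that checks each line's field count, passes clean rows through to a file COPY can load in one pass, and writes rejects with their line numbers to a separate file for hand-fixing. A minimal sketch, assuming tab-delimited data (COPY's default) and a hypothetical three-column table; the delimiter and column count are placeholders for the real schema:

```python
import io

EXPECTED_FIELDS = 3   # assumed column count; adjust to your table
DELIMITER = "\t"      # COPY's default field delimiter

def filter_dump(lines, clean_out, reject_out):
    """Split an iterable of dump lines into clean rows and rejects.

    Clean rows go to clean_out unchanged; rows with the wrong number
    of fields go to reject_out prefixed with their line number.
    Returns (good_count, bad_count).
    """
    good = bad = 0
    for line_no, raw in enumerate(lines, start=1):
        fields = raw.rstrip("\n").split(DELIMITER)
        if len(fields) != EXPECTED_FIELDS:
            reject_out.write("line %d: %s" % (line_no, raw))
            bad += 1
        else:
            clean_out.write(raw)
            good += 1
    return good, bad

# Example run over an in-memory dump: row 2 has too few fields.
clean, rejects = io.StringIO(), io.StringIO()
rows = ["a\tb\t1\n", "broken row\n", "c\td\t2\n"]
good, bad = filter_dump(rows, clean, rejects)
```

After filtering, COPY the clean file as usual, repair the reject file by hand, and load it separately; the line numbers make the glitches easy to locate in the original dump. Real data may also need per-column type checks, which can be added inside the same loop.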

-- sgl


